Low-rank tensor approximation for high-order correlation functions of Gaussian random fields

Authors

  • Daniel Kressner
  • Rajesh Kumar
  • Fabio Nobile
  • Christine Tobler
Abstract

Gaussian random fields are widely used as building blocks for modeling stochastic processes. This paper is concerned with the efficient representation of d-point correlations for such fields, which in turn enables the representation of more general stochastic processes that can be expressed as a function of one (or several) Gaussian random fields. Our representation consists of two ingredients. In the first step, we replace the random field by a truncated Karhunen-Loève expansion and analyze the resulting error. The parameters describing the d-point correlation can be arranged in a tensor, but its storage grows exponentially in d. To avoid this, the second step consists of approximating the tensor in a low-rank tensor format, the so-called Tensor Train decomposition. By exploiting the particular structure of the tensor, an approximation algorithm is derived that does not need to form this tensor explicitly and makes it possible to process correlations of order as high as d = 20. The resulting representation is very compact, and its use is illustrated for elliptic partial differential equations with random Gaussian forcing terms.
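The two ingredients can be illustrated in a few lines of code. The sketch below is an illustration only, not the algorithm of the paper: it discretizes a one-dimensional Gaussian field with an exponential covariance, truncates its Karhunen-Loève expansion after m terms, forms the coefficient tensor of the 4-point correlation explicitly via the Isserlis (Wick) formula, and compresses it with a plain TT-SVD. The covariance kernel, grid, truncation level m, and order d = 4 are arbitrary example choices.

    import numpy as np

    # Truncated Karhunen-Loeve expansion of a 1D Gaussian field with
    # exponential covariance (discrete eigenpairs of the covariance operator).
    n, m = 200, 8                                   # grid points, KL truncation level
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)
    lam, phi = np.linalg.eigh(h * C)                # eigenvalues in ascending order
    lam, phi = lam[::-1][:m], phi[:, ::-1][:, :m]   # keep the m largest terms
    # (phi holds the discrete eigenvectors; only the eigenvalues enter the tensor below)

    # Coefficient tensor of the 4-point correlation:
    # M[i,j,k,l] = sqrt(lam_i lam_j lam_k lam_l) * E[xi_i xi_j xi_k xi_l],
    # with E[xi_i xi_j xi_k xi_l] = d_ij d_kl + d_ik d_jl + d_il d_jk (Isserlis).
    s = np.sqrt(np.maximum(lam, 0.0))
    I = np.eye(m)
    E4 = (np.einsum('ij,kl->ijkl', I, I)
          + np.einsum('ik,jl->ijkl', I, I)
          + np.einsum('il,jk->ijkl', I, I))
    M = np.einsum('i,j,k,l,ijkl->ijkl', s, s, s, s, E4)

    def tt_svd(T, tol=1e-12):
        """Plain TT-SVD: sequential SVDs of the unfoldings of a full tensor."""
        dims, cores, r = T.shape, [], 1
        Cmat = np.asarray(T)
        for k in range(T.ndim - 1):
            Cmat = Cmat.reshape(r * dims[k], -1)
            U, sv, Vt = np.linalg.svd(Cmat, full_matrices=False)
            rk = max(1, int(np.sum(sv > tol * sv[0])))
            cores.append(U[:, :rk].reshape(r, dims[k], rk))
            Cmat = sv[:rk, None] * Vt[:rk]
            r = rk
        cores.append(Cmat.reshape(r, dims[-1], 1))
        return cores

    cores = tt_svd(M)
    print("TT ranks:", [c.shape[2] for c in cores[:-1]])
    print("full storage:", M.size, " TT storage:", sum(c.size for c in cores))

Contracting M with the discrete eigenfunctions phi recovers the 4-point correlation at the grid points. For the orders d considered in the paper this explicit construction is infeasible, which is why the paper's algorithm obtains the TT representation without ever forming the tensor.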

Similar articles

Tensor Regression Meets Gaussian Processes

Low-rank tensor regression, a new model class that learns high-order correlation from data, has recently received considerable attention. At the same time, Gaussian processes (GP) are well-studied machine learning models for structure learning. In this paper, we demonstrate interesting connections between the two, especially for multi-way data analysis. We show that low-rank tensor regression i...
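As a point of reference for the model class mentioned above, the following sketch fits a generic low-rank (CP) tensor regression model by alternating least squares. It is only a hedged illustration of what "low-rank tensor regression" means; it is not the model or the Gaussian-process connection developed in the cited paper, and all sizes, ranks, and data are synthetic.

    import numpy as np

    rng = np.random.default_rng(0)
    N, J, K, L, R = 300, 6, 5, 4, 2                 # samples, tensor mode sizes, CP rank

    # Synthetic data: y_i = <W, X_i> + noise, with W of CP rank R.
    A0, B0, C0 = (rng.normal(size=(d, R)) for d in (J, K, L))
    W_true = np.einsum('jr,kr,lr->jkl', A0, B0, C0)
    X = rng.normal(size=(N, J, K, L))
    y = np.einsum('ijkl,jkl->i', X, W_true) + 0.01 * rng.normal(size=N)

    # Alternating least squares over the three CP factors of W.
    A, B, C = (rng.normal(size=(d, R)) for d in (J, K, L))
    for _ in range(30):
        Z = np.einsum('ijkl,kr,lr->ijr', X, B, C).reshape(N, J * R)
        A = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(J, R)
        Z = np.einsum('ijkl,jr,lr->ikr', X, A, C).reshape(N, K * R)
        B = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(K, R)
        Z = np.einsum('ijkl,jr,kr->ilr', X, A, B).reshape(N, L * R)
        C = np.linalg.lstsq(Z, y, rcond=None)[0].reshape(L, R)

    W_hat = np.einsum('jr,kr,lr->jkl', A, B, C)
    resid = np.einsum('ijkl,jkl->i', X, W_hat) - y
    print("relative training residual:", np.linalg.norm(resid) / np.linalg.norm(y))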

Low rank tensor recovery via iterative hard thresholding

We study extensions of compressive sensing and low rank matrix recovery (matrix completion) to the recovery of low rank tensors of higher order from a small number of linear measurements. While the theoretical understanding of low rank matrix recovery is already well-developed, only a few contributions on the low rank tensor recovery problem are available so far. In this paper, we introduce versi...
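A minimal sketch of the idea, under the assumption that the rank constraint is enforced by a truncated HOSVD (one of several possible retractions; the specific variants introduced in the cited paper are not reproduced here). Problem sizes, ranks, the number of Gaussian measurements, and the iteration count are arbitrary demo choices.

    import numpy as np

    rng = np.random.default_rng(0)
    shape, ranks = (10, 10, 10), (2, 2, 2)

    def unfold(T, mode):
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def hosvd_truncate(T, ranks):
        """Project an order-3 tensor onto multilinear rank `ranks` via truncated HOSVD."""
        Us = [np.linalg.svd(unfold(T, k), full_matrices=False)[0][:, :r]
              for k, r in enumerate(ranks)]
        core = np.einsum('abc,ai,bj,ck->ijk', T, *Us)
        return np.einsum('ijk,ai,bj,ck->abc', core, *Us)

    # Ground-truth tensor of multilinear rank (2, 2, 2) and a Gaussian measurement map.
    G = rng.normal(size=ranks)
    Us = [rng.normal(size=(n, r)) for n, r in zip(shape, ranks)]
    T_true = np.einsum('ijk,ai,bj,ck->abc', G, *Us)
    p = 500                                            # number of linear measurements
    A = rng.normal(size=(p, T_true.size)) / np.sqrt(p)
    y = A @ T_true.ravel()

    # Iterative hard thresholding: gradient step on the residual, then rank truncation.
    X = np.zeros(shape)
    for _ in range(200):
        X = hosvd_truncate(X + (A.T @ (y - A @ X.ravel())).reshape(shape), ranks)

    print("relative recovery error:",
          np.linalg.norm(X - T_true) / np.linalg.norm(T_true))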

Low-rank quadrature-based tensor approximation of the Galerkin projected Newton/Yukawa kernels

Tensor-product approximation provides a convenient tool for efficient numerical treatment of high dimensional problems that arise, in particular, in electronic structure calculations in R^d. In this work we apply tensor approximation to the Galerkin representation of the Newton and Yukawa potentials for a set of tensor-product, piecewise polynomial basis functions. To construct tensor-structured...
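The following hedged sketch shows the kind of quadrature-based separable approximation alluded to above, for the Newton kernel 1/|x| alone (without the Galerkin projection onto piecewise-polynomial basis functions treated in the cited paper). A sinc/trapezoidal rule applied to 1/rho = (2/sqrt(pi)) * int_0^inf exp(-rho^2 t^2) dt, after the substitution t = exp(s), yields an exponential sum whose terms factorize over the coordinates, i.e. a canonical rank-R tensor; step size and truncation bounds below are ad hoc.

    import numpy as np

    # Sinc/trapezoidal quadrature for 1/rho = (2/sqrt(pi)) * int exp(-rho^2 e^{2s}) e^s ds.
    h, smin, smax = 0.2, -30.0, 6.0
    s = np.arange(smin, smax + h, h)
    w = (2.0 * h / np.sqrt(np.pi)) * np.exp(s)     # quadrature weights
    alpha = np.exp(2.0 * s)                        # Gaussian exponents

    # Each term exp(-alpha_r * (x^2 + y^2 + z^2)) factorizes over the coordinates,
    # giving a canonical rank-R approximation of 1/|x| on a grid away from the origin.
    n = 20
    x = np.linspace(0.05, 1.0, n)
    F = np.exp(-np.outer(alpha, x**2))             # univariate Gaussian factors, shape (R, n)
    approx = np.einsum('r,ra,rb,rc->abc', w, F, F, F)

    X, Y, Z = np.meshgrid(x, x, x, indexing='ij')
    exact = 1.0 / np.sqrt(X**2 + Y**2 + Z**2)
    print("separation rank:", len(w))
    print("max relative error:", np.max(np.abs(approx - exact) / exact))
    print("storage: full grid", exact.size, "vs canonical", len(w) * (3 * n + 1))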

Multilevel tensor approximation of PDEs with random data

In this paper, we introduce and analyze a new low-rank multilevel strategy for the solution of random diffusion problems. Using a standard stochastic collocation scheme, we first approximate the infinite dimensional random problem by a deterministic parameter-dependent problem on a high-dimensional parameter domain. Given a hierarchy of finite element discretizations for the spatial approximati...
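As a small single-level illustration of the collocation step described above (the multilevel hierarchy of spatial discretizations in the cited paper is not reproduced), the sketch below solves a 1D diffusion problem with a two-parameter coefficient on a tensor grid of Chebyshev collocation points and inspects the singular values of the resulting table of point values, which reveal its low-rank structure. The coefficient, quantity of interest, and grid sizes are arbitrary choices for the example.

    import numpy as np

    def solve_diffusion(xi1, xi2, n=100):
        """Finite-difference solve of -(a(x) u')' = 1 on (0,1), u(0)=u(1)=0, for one parameter value."""
        x = np.linspace(0.0, 1.0, n + 1)
        xm = 0.5 * (x[:-1] + x[1:])                   # cell midpoints
        a = 1.0 + 0.3 * xi1 * np.sin(np.pi * xm) + 0.2 * xi2 * np.sin(2 * np.pi * xm)
        h = 1.0 / n
        A = np.zeros((n - 1, n - 1))
        for i in range(n - 1):
            A[i, i] = (a[i] + a[i + 1]) / h**2
            if i > 0:
                A[i, i - 1] = -a[i] / h**2
            if i < n - 2:
                A[i, i + 1] = -a[i + 1] / h**2
        u = np.linalg.solve(A, np.ones(n - 1))
        return u[(n - 1) // 2]                        # quantity of interest: u at x = 0.5

    # Tensor grid of collocation points in the two parameters.
    m = 9
    nodes = np.cos(np.pi * (2 * np.arange(m) + 1) / (2 * m))   # Chebyshev points in (-1, 1)
    Q = np.array([[solve_diffusion(p, q) for q in nodes] for p in nodes])

    # Rapidly decaying singular values indicate low-rank structure across the parameter grid.
    print(np.linalg.svd(Q, compute_uv=False))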

Efficient low-rank approximation of the stochastic Galerkin matrix in tensor formats

In this article we describe an efficient approximation of the stochastic Galerkin matrix which stems from a stationary diffusion equation. The uncertain permeability coefficient is assumed to be a log-normal random field with given covariance and mean functions. The approximation is done in the canonical tensor format and then compared numerically with the tensor train and hierarchical tensor f...
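For reference, the stochastic Galerkin matrix referred to above has a sum-of-Kronecker-products structure, A = sum_k G_k ⊗ K_k, which is exactly a canonical (CP) format for the operator. The sketch below only demonstrates how such an operator can be stored and applied without forming any Kronecker product; the matrices are random symmetric placeholders, not an actual Galerkin assembly for a log-normal coefficient.

    import numpy as np

    rng = np.random.default_rng(1)
    M, nS, nX = 5, 30, 40                      # number of terms, stochastic and spatial sizes

    def sym(B):
        return B + B.T

    # Random symmetric placeholders standing in for the stochastic (G_k) and
    # spatial (K_k) factor matrices of a stochastic Galerkin operator.
    Gs = [sym(rng.normal(size=(nS, nS))) for _ in range(M)]
    Ks = [sym(rng.normal(size=(nX, nX))) for _ in range(M)]

    def apply_kron_sum(Gs, Ks, u):
        """Apply sum_k kron(G_k, K_k) to u without forming any Kronecker product."""
        U = u.reshape(Gs[0].shape[0], -1)      # unknowns arranged as an (nS, nX) matrix
        return sum(G @ U @ K.T for G, K in zip(Gs, Ks)).ravel()

    u = rng.normal(size=nS * nX)
    v_factored = apply_kron_sum(Gs, Ks, u)
    v_dense = sum(np.kron(G, K) for G, K in zip(Gs, Ks)) @ u   # dense reference
    print("agreement:", np.linalg.norm(v_factored - v_dense) / np.linalg.norm(v_dense))
    print("storage: dense", (nS * nX) ** 2, "vs factored", M * (nS**2 + nX**2))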

Publication date: 2014